s-Goodness for Low-Rank Matrix Recovery

Authors

  • Lingchen Kong
  • Levent Tunçel
  • Naihua Xiu
  • Jein-Shan Chen
Abstract

If there does not exist such $y$ for some $X$ as above, we set $\gamma_s(A, \beta) = +\infty$, and to be compatible with the special case given by [24] we write $\gamma_s(A)$, $\hat{\gamma}_s(A)$ instead of $\gamma_s(A, +\infty)$, $\hat{\gamma}_s(A, +\infty)$, respectively. From the above definition, we easily see that the set of values that $\gamma$ takes is closed. Thus, when $\gamma_s(A, \beta) < +\infty$, for every matrix $X \in \mathbb{R}^{m \times n}$ with $s$ nonzero singular values, all equal to 1, there exists a vector $y \in \mathbb{R}^p$ such that

\[
\|y\|_d \le \beta, \qquad
\sigma_i(A^* y)
\begin{cases}
= 1, & \text{if } \sigma_i(X) = 1, \\
\in [0, \gamma_s(A, \beta)], & \text{if } \sigma_i(X) = 0,
\end{cases}
\qquad i \in \{1, 2, \ldots, r\}. \tag{10}
\]

Similarly, for every matrix $X \in \mathbb{R}^{m \times n}$ with $s$ nonzero singular values, all equal to 1, there exists a vector $y \in \mathbb{R}^p$ such that $A^* y$ and $X$ share the same orthogonal row and column spaces and

\[
\|y\|_d \le \beta, \qquad \|A^* y - X\| \le \hat{\gamma}_s(A, \beta). \tag{11}
\]

Observing that the set $\{A^* y : \|y\|_d \le \beta\}$ is convex, we obtain that if $\gamma_s(A, \beta) < +\infty$, then for every matrix $X$ with at most $s$ nonzero singular values and $\|X\| \le 1$ there exist vectors $y$ satisfying (10), and there exist vectors $y$ satisfying (11).

2.2. Basic Properties of G-Numbers. In order to characterize the $s$-goodness of a linear transformation $A$, we study the basic properties of G-numbers. We begin with the result that the G-numbers $\gamma_s(A, \beta)$ and $\hat{\gamma}_s(A, \beta)$ are convex nonincreasing functions of $\beta$.

Proposition 3. For every linear transformation $A$ and every $s \in \{0, 1, \ldots, r\}$, the G-numbers $\gamma_s(A, \beta)$ and $\hat{\gamma}_s(A, \beta)$ are convex nonincreasing functions of $\beta \in [0, +\infty]$.

Proof. We only need to demonstrate that the quantity $\gamma_s(A, \beta)$ is a convex nonincreasing function of $\beta \in [0, +\infty]$. It is evident from the definition that $\gamma_s(A, \beta)$ is nonincreasing in $\beta$ for given $A$, $s$. It remains to show that $\gamma_s(A, \beta)$ is a convex function of $\beta$; in other words, for every pair $\beta_1, \beta_2 \in [0, +\infty]$, we need to verify that

\[
\gamma_s(A, \alpha \beta_1 + (1 - \alpha) \beta_2) \le \alpha \gamma_s(A, \beta_1) + (1 - \alpha) \gamma_s(A, \beta_2), \qquad \forall \alpha \in [0, 1]. \tag{12}
\]

The above inequality follows immediately if one of $\beta_1, \beta_2$ is $+\infty$; thus, we may assume $\beta_1, \beta_2 \in [0, +\infty)$. In fact, from the argument around (10) and the definition of $\gamma_s(A, \cdot)$, we know that for every matrix $X = U \operatorname{Diag}(\sigma(X)) V^T$ with $s$ nonzero singular values, all equal to 1, there exist vectors $y_1, y_2 \in \mathbb{R}^p$ such that, for $k \in \{1, 2\}$,

\[
\|y_k\|_d \le \beta_k, \qquad
\sigma_i(A^* y_k)
\begin{cases}
= 1, & \text{if } \sigma_i(X) = 1, \\
\in [0, \gamma_s(A, \beta_k)], & \text{if } \sigma_i(X) = 0,
\end{cases}
\qquad i \in \{1, 2, \ldots, r\}. \tag{13}
\]

It is immediate from (13) that $\|\alpha y_1 + (1 - \alpha) y_2\|_d \le \alpha \beta_1 + (1 - \alpha) \beta_2$. Moreover, from the above information on the singular values of $A^* y_1$, $A^* y_2$, we may set $A^* y_k = X + Y_k$, $k \in \{1, 2\}$, such that ...
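The excerpt cuts off here, mid-proof. The following is a sketch of how this convexity step is typically completed, following the standard G-number argument; it is an assumed reconstruction, not text recovered from the source:

\[
A^* y_k = X + Y_k, \qquad X^T Y_k = 0, \quad X Y_k^T = 0, \quad \|Y_k\| \le \gamma_s(A, \beta_k), \quad k \in \{1, 2\},
\]
so that, for $y = \alpha y_1 + (1 - \alpha) y_2$,
\[
A^* y = X + \alpha Y_1 + (1 - \alpha) Y_2, \qquad
\|\alpha Y_1 + (1 - \alpha) Y_2\| \le \alpha \gamma_s(A, \beta_1) + (1 - \alpha) \gamma_s(A, \beta_2),
\]
which, combined with $\|y\|_d \le \alpha \beta_1 + (1 - \alpha) \beta_2$, yields (12) by the definition of $\gamma_s(A, \cdot)$.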


Similar Articles

Sufficient Conditions for Low-rank Matrix Recovery, Translated from Sparse Signal Recovery

Low-rank matrix recovery (LMR) is a rank minimization problem subject to linear equality constraints, and it arises in many fields such as signal and image processing, statistics, computer vision, system identification, and control. This class of optimization problems is NP-hard, and a popular approach replaces the rank function with the nuclear norm of the matrix variable. In this paper, we ...
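The nuclear-norm surrogate described above is easy to try numerically. Below is a minimal sketch, assuming the cvxpy package is available; the problem sizes, random sensing matrices A_i, and data b are illustrative placeholders, not taken from any of the papers listed here.

    # Minimal numerical sketch of nuclear-norm minimization for LMR
    # (assumption: cvxpy is installed; all data below is synthetic).
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    m, n, s, p = 8, 8, 2, 50                     # matrix size, true rank, measurements

    # Ground-truth rank-s matrix and a random linear transformation
    # A(X) = (<A_1, X>, ..., <A_p, X>).
    X0 = rng.standard_normal((m, s)) @ rng.standard_normal((s, n))
    A_mats = [rng.standard_normal((m, n)) for _ in range(p)]
    b = np.array([np.trace(Ai.T @ X0) for Ai in A_mats])

    # Convex surrogate: minimize the nuclear norm ||X||_* instead of rank(X).
    X = cp.Variable((m, n))
    constraints = [cp.trace(Ai.T @ X) == b[i] for i, Ai in enumerate(A_mats)]
    cp.Problem(cp.Minimize(cp.norm(X, "nuc")), constraints).solve()

    print("relative recovery error:",
          np.linalg.norm(X.value - X0) / np.linalg.norm(X0))

With enough generic measurements (here p = 50 against s(m + n - s) = 28 degrees of freedom), the relaxation typically recovers X0 up to solver tolerance.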


S-goodness for Low-rank Matrix Recovery, Translated from Sparse Signal Recovery

Low-rank matrix recovery (LMR) is a rank minimization problem subject to linear equality constraints, and it arises in many fields such as signal and image processing, statistics, computer vision, system identification and control. This class of optimization problems is generally NP-hard. A popular approach replaces the rank function with the nuclear norm of the matrix variable. In this paper, ...


Sparse Recovery on Euclidean Jordan Algebras

We consider the sparse recovery problem on a Euclidean Jordan algebra (SREJA), which includes sparse signal recovery and low-rank symmetric matrix recovery as special cases. We introduce the restricted isometry property, null space property (NSP), and s-goodness for linear transformations in s-sparse element recovery on a Euclidean Jordan algebra, all of which provide sufficient conditions ...


S-semigoodness for Low-Rank Semidefinite Matrix Recovery

We extend and characterize the concept of s-semigoodness for a sensing matrix in sparse nonnegative recovery (proposed by Juditsky, Karzan, and Nemirovski [Math Program, 2011]) to the linear transformations in low-rank semidefinite matrix recovery. We show that s-semigoodness is not only a necessary and sufficient condition for exact s-rank semidefinite matrix recovery by a semidefinite program,...


On Low Rank Matrix Approximations with Applications to Synthesis Problem in Compressed Sensing

We consider the synthesis problem of Compressed Sensing: given s and an M × n matrix A, extract from it an m × n submatrix A_m, certified to be s-good, with m as small as possible. Starting from the verifiable sufficient conditions of s-goodness, we express the synthesis problem as the problem of approximating a given matrix by a matrix of specified low rank in the uniform norm. We propose randomi...


Rank-Sparsity Incoherence for Matrix Decomposition

Suppose we are given a matrix that is formed by adding an unknown sparse matrix to an unknown low-rank matrix. Our goal is to decompose the given matrix into its sparse and low-rank components. Such a problem arises in a number of applications in model and system identification, and is NP-hard in general. In this paper we consider a convex optimization formulation to splitting the specified mat...
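A minimal sketch of such a convex splitting, assuming cvxpy is available; the weight lam = 1/sqrt(n) is a common heuristic choice, an assumption rather than the formulation tuned in the cited paper.

    # Minimal sketch of sparse-plus-low-rank decomposition
    # (assumption: cvxpy installed; data below is synthetic).
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(1)
    n = 20
    L0 = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))   # rank-2 part
    S0 = np.zeros((n, n))
    mask = rng.random((n, n)) < 0.05                                 # ~5% sparse support
    S0[mask] = 5 * rng.standard_normal(mask.sum())
    M = L0 + S0                                                      # observed matrix

    # Decompose M: the nuclear norm promotes low rank in L,
    # the elementwise l1 norm promotes sparsity in S.
    L, S = cp.Variable((n, n)), cp.Variable((n, n))
    lam = 1.0 / np.sqrt(n)                                           # heuristic weight
    cp.Problem(cp.Minimize(cp.norm(L, "nuc") + lam * cp.sum(cp.abs(S))),
               [L + S == M]).solve()

    print("relative error of low-rank part:",
          np.linalg.norm(L.value - L0) / np.linalg.norm(L0))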



Journal:

Volume   Issue

Pages  -

Publication date: 2014